Santosh Vempala
0:49:30
Santosh Vempala: The KLS conjecture I
0:54:28
Reducing Isotropy to KLS: An Almost Cubic Volume Algorithm by Santosh Vempala
0:29:52
Santosh S. Vempala, The Communication Complexity of Optimization
0:25:40
Santosh Vempala - Gibbs Sampling for Convex Bodies and an L_0 Isoperimetric Inequality
1:01:40
Panel Discussion: Open Problems in the Theory of Deep Learning
0:32:48
Santosh Vempala -- Multicriteria Dimensionality Reduction
1:31:08
Unsupervised Learning
0:52:48
Santosh Vempala: The KLS conjecture II
1:08:30
TCS+ talk: Santosh Vempala
0:59:28
Robustly Learning Mixtures of Arbitrary Gaussians in Polynomial Time by Santosh Vempala
1:12:50
Lifelong Learning of Representations with Provable Guarantees by Santosh Vempala (Georgia Tech)
0:43:59
Santosh Vempala (Gatech) -- Is There a Tractable (and Interesting) Theory of Nonconvex Optimization?
0:34:04
Fast Algorithms for Multivariate ODEs arising in Sampling and Learning
0:05:17
Georgia Tech professor discusses how generative AI works
0:42:36
Sampling Convex Bodies: A Status Report
0:32:51
Continuous Algorithms: Sampling and Optimization in High Dimension
0:08:44
Cortical Computation via Iterative Constructions
0:54:39
An SDP-based Algorithmic Cheeger Inequality for Vertex Expansion
0:49:25
Santosh Vempala: Beyond Moments: Robust certificates for affine transformations
0:58:31
On The Complexity of Training a Neural Network by Santosh Vempala
0:57:17
High Dimensional Geometry and Concentration I
0:57:41
Subsampled Power Iteration: A Unified Algorithm for Block Models and Planted CSP's
1:02:53
High Dimensional Geometry and Concentration II
0:30:41
The Convergence of Hamiltonian Monte Carlo